Improved Storage Capacity of Hebbian Learning Attractor Neural Network with Bump Formations
Abstract
Recently, bump formations in attractor neural networks with distance-dependent connectivity have attracted increasing interest in biological and computational neuroscience. Although distance-dependent connectivity is common in biological networks, a common shortcoming of such networks is the sharp drop in the number of patterns p that can be remembered when the activity changes from global to bump-like, which makes these networks rather ineffective. In this paper we present a bump-based recurrent network specifically designed to increase its capacity, which becomes comparable to that of a randomly connected sparse network. To this aim, we tested a selection of 700 natural images on a network of N = 64K neurons with connectivity C per neuron. We show that the capacity of the network is of order C, in accordance with the capacity of a highly diluted network. When the number of connections per neuron is kept fixed, a non-trivial dependence on the connectivity radius is observed. Our results show that the capacity drop of the bumpy network can be avoided.
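To make the construction concrete, the following is a minimal sketch of a Hebbian attractor network with distance-dependent connectivity. It is an illustration only, not the authors' code: the network lives on a 1-D ring instead of a 2-D sheet, the sizes N, R, P and the sparseness a are placeholder assumptions, and storage uses a simple covariance rule with a global retrieval threshold.

```python
import numpy as np

# Sketch: Hebbian attractor network with distance-dependent connectivity.
# Assumptions: 1-D ring topology and toy sizes (the paper uses N = 64K
# neurons and natural-image patterns).
rng = np.random.default_rng(0)

N = 500   # neurons on a ring (toy size)
R = 50    # connectivity radius: i and j connected iff ring-distance <= R
P = 10    # number of stored patterns
a = 0.2   # sparseness: fraction of active neurons per pattern

# Distance-dependent connectivity mask (ring metric), about 2R links/neuron.
idx = np.arange(N)
dist = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(dist, N - dist)
mask = (dist <= R) & (dist > 0)

# Sparse binary patterns stored with a covariance (Hebbian) rule,
# restricted to the allowed connections.
patterns = (rng.random((P, N)) < a).astype(float)
C = mask.sum(axis=1).mean()                    # mean connections per neuron
J = ((patterns - a).T @ (patterns - a)) / (a * (1 - a) * C)
J = J * mask                                   # cut synapses beyond radius R

def retrieve(J, cue, theta=0.0, steps=50):
    """Parallel binary dynamics with a global activity threshold."""
    s = cue.copy()
    for _ in range(steps):
        s = (J @ s > theta).astype(float)
    return s

# Retrieval from a 10%-corrupted cue of pattern 0, measured by overlap.
cue = patterns[0].copy()
flip = rng.random(N) < 0.1
cue[flip] = 1 - cue[flip]
out = retrieve(J, cue)
overlap = (out - a) @ (patterns[0] - a) / (N * a * (1 - a))
print(f"overlap with stored pattern: {overlap:.2f}")
```

Loading more and more patterns until retrieval fails gives an empirical estimate of the capacity p, which the paper argues scales with the number of connections per neuron C.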
Similar resources
Bump formation in a binary attractor neural network.
The conditions for the formation of local bumps in the activity of binary attractor neural networks with spatially dependent connectivity are investigated. We show that such formations arise when an asymmetry between the activity during retrieval and learning is imposed. An analytical approximation for the order parameters is derived. The corresponding phase diagram shows a relatively ...
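One simple way to impose such an asymmetry (my reading of the truncated abstract, not necessarily the paper's exact model) is to learn at sparseness a but force the retrieval activity below a, for example by letting a k-winners-take-all step stand in for global inhibition; with distance-restricted weights, the surviving activity then condenses into a spatially localized bump.

```python
import numpy as np

def kwta_retrieve(J, cue, k, steps=50):
    """Binary retrieval where global inhibition is modeled as
    k-winners-take-all, clamping the active fraction below the
    sparseness used during learning (illustrative assumption)."""
    s = cue.copy()
    for _ in range(steps):
        h = J @ s
        winners = np.argsort(h)[-k:]   # keep only the k most-driven neurons
        s = np.zeros_like(s)
        s[winners] = 1.0
    return s

# With J, patterns, a, N built as in the sketch above, retrieving at a
# lower activity level, e.g.
#   out = kwta_retrieve(J, patterns[0], k=int(0.5 * a * N))
# yields bump-like, spatially contiguous activity on the ring.
```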
Network Capacity for Latent Attractor Computation
Attractor networks have been one of the most successful paradigms in neural computation and have been used as models of computation in the nervous system. Many experimentally observed phenomena, such as coherent population codes, contextual representations, and replay of learned neural activity patterns, are explained well by attractor dynamics. Recently, we proposed a paradigm called latent attractor...
Stochastic Dynamics and High Capacity Associative Memories
The addition of noise to the deterministic Hopfield network, trained with one-shot Hebbian learning, is known to bring benefits in the elimination of spurious attractors. This paper extends the analysis to learning rules that have a much higher capacity. The relative energy of desired and spurious attractors is reported, and the effect of adding noise to the dynamics is empirically investigated....
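As a concrete reference point, noisy dynamics of this kind is commonly implemented as Glauber updates at a temperature T. The sketch below is a generic textbook version, not the paper's specific setup, and works with any weight matrix J (one-shot Hebbian or higher-capacity rules alike).

```python
import numpy as np

def glauber_sweep(J, s, T, rng):
    """One asynchronous sweep of stochastic (Glauber) dynamics for a
    +/-1 network at temperature T; T -> 0 recovers the deterministic
    Hopfield update. Generic illustration; all parameters are assumptions."""
    for i in rng.permutation(len(s)):
        h = J[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T)) if T > 0 else float(h > 0)
        s[i] = 1.0 if rng.random() < p_up else -1.0
    return s

# Annealing T from a moderate value toward 0 lets the state escape
# shallow spurious attractors while the deeper stored patterns remain
# stable, which is the kind of effect the abstract describes.
```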
Learning in sparse attractor networks with inhibition
Attractor networks are important models of brain function at the behavioral and physiological level, but learning of sparse patterns has not been fully explained. Here we show that including the activity-dependent effect of an inhibitory pool in Hebbian learning can accomplish learning of stable sparse attractors in both continuous attractor and point attractor neural networks.
CAM Storage of Analog Patterns and Continuous Sequences with 3N² Weights
A simple architecture and algorithm for analytically guaranteed associative memory storage of analog patterns, continuous sequences, and chaotic attractors in the same network is described. A matrix inversion determines network weights, given prototype patterns to be stored. There are N units of cap...
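The matrix-inversion step can be illustrated with the standard projection (pseudoinverse) construction; this is a generic sketch under the assumption of linearly independent prototypes, not necessarily the paper's exact rule.

```python
import numpy as np

def projection_weights(X):
    """Weights by matrix inversion: W = X^T (X X^T)^{-1} X makes every
    prototype row of X an exact fixed point of s -> W s. Assumes the P
    prototypes are linearly independent (generic illustration)."""
    G = X @ X.T                        # P x P Gram matrix of the prototypes
    return X.T @ np.linalg.inv(G) @ X

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 40))       # 5 analog prototypes over 40 units
W = projection_weights(X)
print(np.allclose(W @ X[0], X[0]))     # each prototype is stored exactly: True
```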
Publication year: 2006